k-Nearest Neighbor Learning with Graph Neural Networks

Authors

Abstract

k-nearest neighbor (kNN) is a widely used learning algorithm for supervised learning tasks. In practice, the main challenge when using kNN is its high sensitivity to the hyperparameter setting, including the number of nearest neighbors k, the distance function, and the weighting function. To improve robustness to these hyperparameters, this study presents a novel kNN learning method based on a graph neural network, named kNNGNN. Given training data, the method learns a task-specific kNN rule in an end-to-end fashion by means of a graph neural network that takes the kNN graph of an instance as input to predict the label of the instance. The distance and weighting functions are implicitly embedded within the network. For a query instance, the prediction is obtained by performing a kNN search from the training data to create a kNN graph and passing it through the network. The effectiveness of the proposed method is demonstrated on various benchmark datasets for classification and regression tasks.
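The sketch below illustrates the general idea in PyTorch: the k nearest training instances of a query form a small graph, and a learnable message-passing step plays the role of the hand-crafted distance and weighting functions of classical kNN. It is a minimal illustration under simplifying assumptions (Euclidean kNN search, one message-passing layer, classification only), not the authors' implementation; the class and variable names are ours.

# Minimal sketch of the kNNGNN idea (not the paper's code). Assumptions:
# Euclidean kNN search, a single message-passing layer, classification task.
import torch
import torch.nn as nn

class KNNGNNSketch(nn.Module):
    def __init__(self, in_dim, n_classes, k=5, hidden=32):
        super().__init__()
        self.k = k
        self.n_classes = n_classes
        # message along a kNN edge: query features, neighbor features, neighbor label
        self.msg = nn.Sequential(nn.Linear(2 * in_dim + n_classes, hidden), nn.ReLU())
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, query, train_x, train_y):
        # kNN search against the training data (Euclidean distance)
        d = torch.cdist(query, train_x)                      # (Q, N)
        idx = d.topk(self.k, largest=False).indices          # (Q, k)
        nbr_x = train_x[idx]                                  # (Q, k, D)
        nbr_y = nn.functional.one_hot(train_y[idx], self.n_classes).float()
        q = query.unsqueeze(1).expand(-1, self.k, -1)         # (Q, k, D)
        m = self.msg(torch.cat([q, nbr_x, nbr_y], dim=-1))    # messages along kNN edges
        return self.out(m.mean(dim=1))                        # aggregate and predict logits

# Toy usage: the network is trained end-to-end, so it learns its own kNN weighting rule.
# Queries are drawn from a held-out split so an instance is not its own neighbor.
torch.manual_seed(0)
X, y = torch.randn(200, 4), torch.randint(0, 3, (200,))
train_x, train_y, query_x, query_y = X[:150], y[:150], X[150:], y[150:]
model = KNNGNNSketch(in_dim=4, n_classes=3, k=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(100):
    logits = model(query_x, train_x, train_y)
    loss = nn.functional.cross_entropy(logits, query_y)
    opt.zero_grad(); loss.backward(); opt.step()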


Similar articles

Fast Approximate Nearest-Neighbor Search with k-Nearest Neighbor Graph

We introduce a new nearest neighbor search algorithm. The algorithm builds a nearest neighbor graph in an offline phase and when queried with a new point, performs hill-climbing starting from a randomly sampled node of the graph. We provide theoretical guarantees for the accuracy and the computational complexity and empirically show the effectiveness of this algorithm.
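A rough sketch of this search scheme, under our own assumptions (brute-force graph construction, Euclidean distance, greedy hill-climbing that stops at a local minimum), is shown below; it is an illustration of the idea rather than the paper's algorithm.

# Graph-based approximate NN search: build a kNN graph offline, then answer a query
# by hill-climbing from a random node toward whichever neighbor is closest to the query.
import numpy as np

def build_knn_graph(data, k):
    # O(N^2) brute-force construction for the offline phase (illustration only)
    d = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]          # neighbor lists, shape (N, k)

def hill_climb_search(data, graph, query, rng):
    cur = rng.integers(len(data))                 # random starting node
    cur_dist = np.linalg.norm(data[cur] - query)
    while True:
        nbrs = graph[cur]
        nbr_dists = np.linalg.norm(data[nbrs] - query, axis=1)
        if nbr_dists.min() >= cur_dist:           # local minimum: no neighbor is closer
            return cur
        cur = nbrs[np.argmin(nbr_dists)]
        cur_dist = nbr_dists.min()

rng = np.random.default_rng(0)
pts = rng.normal(size=(1000, 8))
g = build_knn_graph(pts, k=10)
q = rng.normal(size=8)
approx = hill_climb_search(pts, g, q, rng)
exact = np.argmin(np.linalg.norm(pts - q, axis=1))
print(approx, exact)                              # approximate vs. exact nearest neighbor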

Percolation in the k-nearest neighbor graph

Let P be a Poisson process of intensity one in ℝ². For a fixed integer k, join every point of P to its k nearest neighbors, creating a directed random geometric graph G_k. We prove bounds on the values of k that, almost surely, result in an infinite connected component in G_k for various definitions of "component". We also give high-confidence results for the exact values of k needed. I...
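A small simulation of the construction described above (not of the paper's percolation analysis) is sketched below; the box size, k, and the weak-component notion are our own choices for illustration.

# Sample a Poisson process of intensity one in a square and join each point to its
# k nearest neighbors; report the largest weakly connected component of the digraph.
import numpy as np

rng = np.random.default_rng(1)
side, k = 20.0, 3
n = rng.poisson(side * side)                  # intensity-one Poisson process on a side x side box
pts = rng.uniform(0.0, side, size=(n, 2))

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
out_edges = np.argsort(d, axis=1)[:, :k]      # directed edges: i -> its k nearest neighbors

# Union-find over the undirected versions of the edges gives weak components.
parent = np.arange(n)
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i
for i in range(n):
    for j in out_edges[i]:
        parent[find(i)] = find(j)
roots = np.array([find(i) for i in range(n)])
print("largest weak component:", np.bincount(roots).max(), "of", n, "points")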

k-Nearest Neighbor Augmented Neural Networks for Text Classification

In recent years, many deep-learning-based models have been proposed for text classification. Such models fit the training set well from the statistical point of view. However, they lack the capacity to utilize instance-level information from individual instances in the training set. In this work, we propose to enhance neural network models by allowing them to leverage information from k-nea...
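The abstract is truncated, so the snippet below is only a hedged illustration of one common way to inject instance-level information into a neural classifier, not necessarily the mechanism of the paper above: the model's predicted class distribution is interpolated with a distribution over the labels of the k nearest training examples in the model's representation space. The function name, interpolation weight lam, and the flat model_probs are our own.

import numpy as np

def knn_augmented_probs(query_repr, model_probs, train_reprs, train_labels,
                        n_classes, k=5, lam=0.5):
    # retrieve the k nearest training representations and turn their labels into a distribution
    d = np.linalg.norm(train_reprs - query_repr, axis=1)
    nbrs = np.argsort(d)[:k]
    knn_probs = np.bincount(train_labels[nbrs], minlength=n_classes) / k
    return lam * model_probs + (1.0 - lam) * knn_probs   # simple linear interpolation

rng = np.random.default_rng(0)
train_reprs = rng.normal(size=(100, 16))                  # hypothetical encoder outputs
train_labels = rng.integers(0, 4, size=100)
query_repr = rng.normal(size=16)
model_probs = np.full(4, 0.25)                            # the base model's (flat) prediction
print(knn_augmented_probs(query_repr, model_probs, train_reprs, train_labels, n_classes=4))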

Pruned Bi-directed K-nearest Neighbor Graph for Proximity Search

In this paper, we address the problem of fast proximity search for high-dimensional data by using a graph as an index. Graph-based methods that use the k-nearest neighbor graph (KNNG) as an index perform better than tree-based and hash-based methods in terms of search precision and query time. To further improve the performance of the KNNG, the number of edges should be increased. However, ...
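As a hedged illustration of how the edge count of a KNNG index can be increased without recomputing neighbors, the sketch below adds each edge's reverse, giving a bi-directed kNN graph; the paper's specific pruning rule is not reproduced here, and the function names are ours.

import numpy as np

def knn_graph(data, k):
    d = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]           # directed kNN edge lists

def bidirect(knn_idx):
    # adjacency sets: original out-edges plus reverse edges from points that selected node i
    adj = [set(row) for row in knn_idx.tolist()]
    for i, row in enumerate(knn_idx):
        for j in row:
            adj[j].add(i)
    return adj

rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 8))
adj = bidirect(knn_graph(pts, k=10))
degrees = np.array([len(a) for a in adj])
print("mean degree after bi-directing:", degrees.mean())   # > k, since reverse edges were added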

Fast PNN-based Clustering Using K-nearest Neighbor Graph

The search for nearest neighbors is the main source of computation in most clustering algorithms. We propose the use of a nearest neighbor graph for reducing the number of candidates. The number of distance calculations per search can be reduced from O(N) to O(k), where N is the number of clusters and k is the number of neighbors in the graph. We apply the proposed scheme within agglomerative clusteri...
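The sketch below shows only the candidate-restriction idea stated above, under our own simplifications (plain centroid distances, a freshly built graph): with a kNN graph over the current clusters, the search for a cluster's nearest merge partner inspects its k graph neighbors instead of all N clusters. The paper's actual PNN merge cost and graph-maintenance strategy are not reproduced here.

import numpy as np

def knn_lists(centroids, k):
    d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

rng = np.random.default_rng(0)
centroids = rng.normal(size=(200, 2))          # current cluster centroids
k = 5
graph = knn_lists(centroids, k)

# Nearest-partner search for cluster 0 restricted to its graph neighbors: k distances.
cand = graph[0]
dist_to_cand = np.linalg.norm(centroids[cand] - centroids[0], axis=1)
partner = cand[np.argmin(dist_to_cand)]

# Exhaustive search for comparison: N - 1 distances.
all_d = np.linalg.norm(centroids - centroids[0], axis=1)
all_d[0] = np.inf
print(partner, np.argmin(all_d))               # the restricted search finds the same partner here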


Journal

Journal title: Mathematics

Year: 2021

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math9080830